Bayesian Shrinkage Variable Selection
Authors
Abstract
We introduce a new Bayesian approach to the variable selection problem which we term Bayesian Shrinkage Variable Selection (BSVS). This approach is inspired by the Relevance Vector Machine (RVM), which uses a Bayesian hierarchical linear setup to perform variable selection and model estimation. RVM is typically applied in the context of kernel regression, although it is also suitable in the standard regression context. Extending the RVM algorithm, we include a proper prior distribution for the precisions of the regression coefficients, v_j ∼ f(v_j⁻¹ | η), where η is a scalar hyperparameter. Based upon this model, we derive the full set of conditional distributions for the parameters, as would typically be done when applying Gibbs sampling. However, instead of simulating samples from the joint posterior distribution in order to estimate the posterior means of the parameters, we use the full conditionals to find the joint maximum of the posterior distribution p(β, σ², V | y, η) given the value of the hyperparameter η. While models with η = 0 result in an "RVM-like" solution, those with η > 0 induce further shrinkage, leading to more parsimonious models with smaller MSE and prediction errors than traditional RVM models. η is estimated by maximizing the marginal likelihood.

Copyright © 2007 Artin Armagan and Russell L. Zaretzki
Ph.D. Student in Statistics, The University of Tennessee, Knoxville, USA
Assistant Professor in Statistics, The University of Tennessee, Knoxville, USA
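The maximization scheme described above (cycling through conditional modes rather than Gibbs sampling) can be sketched in a few lines. The abstract does not fully specify the prior f(v_j⁻¹ | η), so this sketch assumes, purely for illustration, an inverse-gamma prior v_j ∼ Inv-Gamma(a, η) and a Jeffreys-type prior on σ²; the function name `bsvs_icm` and all parameter defaults are hypothetical, not taken from the paper.

```python
import numpy as np

def bsvs_icm(X, y, eta=0.1, a=0.5, n_iter=200, prune_tol=1e-8):
    """Iterated-conditional-modes sketch of maximizing
    p(beta, sigma^2, V | y, eta) for the hierarchical model
    y ~ N(X beta, sigma^2 I), beta_j ~ N(0, v_j),
    with the illustrative assumption v_j ~ Inv-Gamma(a, eta).
    The exact prior f(v_j^{-1} | eta) in the paper may differ."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2 = np.var(y - X @ beta) + 1e-8
    v = np.ones(p)
    for _ in range(n_iter):
        # Conditional mode of beta: ridge-type solve with
        # a separate penalty 1/v_j for each coefficient.
        A = X.T @ X / sigma2 + np.diag(1.0 / v)
        beta = np.linalg.solve(A, X.T @ y / sigma2)
        # Conditional mode of sigma^2 (Jeffreys-type prior assumed).
        resid = y - X @ beta
        sigma2 = (resid @ resid) / (n + 2)
        # Conditional of v_j is Inv-Gamma(a + 1/2, eta + beta_j^2/2);
        # its mode gives the update below. Setting eta = 0 recovers
        # an RVM-like v_j proportional to beta_j^2, while eta > 0
        # keeps shrinking small coefficients toward zero.
        v = (eta + 0.5 * beta**2) / (a + 1.5)
        v = np.maximum(v, prune_tol)  # guard against singular solves
    return beta, sigma2, v
```

In practice the coefficients whose v_j collapses toward zero are effectively pruned from the model, which is the source of the parsimony the abstract describes.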
Similar resources
Bayesian Shrinkage Variable Selection, April 25, 2008
We introduce a new Bayesian approach to the variable selection problem which we term Bayesian Shrinkage Variable Selection (BSVS). This approach is inspired by the Relevance Vector Machine (RVM), which uses a Bayesian hierarchical linear setup to do variable selection and model estimation. RVM is typically applied in the context of kernel regression although it is also suitable in the standar...
Decoupling Shrinkage and Selection in Bayesian Linear Models: A Posterior Summary Perspective, by P. Richard Hahn and Carlos M. Carvalho, Booth School of Business and McCombs School of Business
Selecting a subset of variables for linear models remains an active area of research. This paper reviews many of the recent contributions to the Bayesian model selection and shrinkage prior literature. A posterior variable selection summary is proposed...
A Review of Bayesian Variable Selection Methods: What, How and Which
The selection of variables in regression problems has occupied the minds of many statisticians. Several Bayesian variable selection methods have been developed, and we concentrate on the following methods: Kuo & Mallick, Gibbs Variable Selection (GVS), Stochastic Search Variable Selection (SSVS), adaptive shrinkage with Jeffreys’ prior or a Laplacian prior, and reversible jump MCMC. We review t...
Bayesian Variable Selection in Semiparametric Proportional Hazards Model for High Dimensional Survival Data
Variable selection for high dimensional data has recently received a great deal of attention. However, due to the complex structure of the likelihood, only limited developments have been made for time-to-event data where censoring is present. In this paper, we propose a Bayesian variable selection scheme for a Bayesian semiparametric survival model for right censored survival data sets. A speci...
Bayesian Computation and the Linear Model
This paper is a review of computational strategies for Bayesian shrinkage and variable selection in the linear model. Our focus is less on traditional MCMC methods, which are covered in depth by earlier review papers. Instead, we focus more on recent innovations in stochastic search and adaptive MCMC, along with some comparatively new research on shrinkage priors. One of our conclusions is that...